On Adjustment Functions for Weight-Adjusted Voting-Based Ensembles of Classifiers

Author

  • Kuo-Wei Hsu
Abstract

An ensemble of classifiers is a system consisting of multiple member classifiers that are trained individually and whose outcomes are aggregated into an overall outcome for a test instance. Voting is a common approach to aggregating the outcomes generated by member classifiers. Ensembles based on weighted voting have been studied for some time. However, most studies focus on weight assignment rather than on weight adjustment, the basic idea of which is to increase the weights of votes from member classifiers that perform better on data instances of higher difficulty. In this paper, we present our study of adjustment functions, in each of which both the performance of a member classifier and the difficulty of a data set are determined nonlinearly. We report results from experiments conducted on several data sets, demonstrating the potential of the studied functions.
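The abstract describes weight adjustment only at a high level. As a minimal sketch of the idea, the following combines weighted voting with a hypothetical nonlinear adjustment function; the specific power and exponential forms, the `gamma` parameter, and the difficulty score are illustrative assumptions, not the functions actually studied in the paper.

```python
import math

def adjust_weight(accuracy, difficulty, gamma=3.0):
    # Hypothetical nonlinear adjustment function (illustrative, not the
    # paper's exact form): member accuracy enters as a power and data-set
    # difficulty enters exponentially, so accurate members voting on hard
    # data receive disproportionately large voting weights.
    return (accuracy ** gamma) * math.exp(difficulty)

def weighted_vote(predictions, weights, n_classes):
    # Sum each member's adjusted weight into the bin of its predicted
    # class, then return the class with the largest total.
    scores = [0.0] * n_classes
    for pred, w in zip(predictions, weights):
        scores[pred] += w
    return scores.index(max(scores))

# Three member classifiers vote on one test instance drawn from a data
# set with an assumed difficulty score of 0.8.
accuracies = [0.9, 0.7, 0.6]          # per-member validation accuracy
weights = [adjust_weight(a, 0.8) for a in accuracies]
print(weighted_vote([1, 0, 0], weights, n_classes=2))  # → 1
```

Note that a plain majority vote over the same predictions would return class 0 (two members against one); the nonlinear adjustment lets the single more accurate member outweigh the two weaker ones, which is the behavior weight adjustment is meant to produce.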


Related articles

Learning to Assemble Classifiers via Genetic Programming

This article introduces a novel approach for building heterogeneous ensembles based on genetic programming (GP). Ensemble learning is a paradigm that aims at combining individual classifiers' outputs to improve their performance. Commonly, classifiers' outputs are combined by a weighted sum or a voting strategy. However, linear fusion functions may not effectively exploit individual models' redun...


Application of ensemble learning techniques to model the atmospheric concentration of SO2

For pollution prediction modeling, the study adopts homogeneous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of sulphur dioxide. For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression and re...


Exploiting Diversity in Ensembles: Improving the Performance on Unbalanced Datasets

Ensembles are often capable of greater predictive performance than any of their individual classifiers. Despite the need for classifiers to make different kinds of errors, the typically used majority voting scheme treats each classifier as though it contributed equally to the group's performance. This can be particularly limiting on unbalanced datasets, as one is more interested in complement...


Classifier selection for majority voting

Individual classification models are increasingly challenged by combined pattern recognition systems, which often show better performance. In such systems the optimal set of classifiers is first selected and then combined by a specific fusion method. For a small number of classifiers optimal ensembles can be found exhaustively, but the burden of exponential complexity of such search limits its prac...


Bagging and Boosting with Dynamic Integration of Classifiers

One approach in classification tasks is to use machine learning techniques to derive classifiers from learning instances. The cooperation of several base classifiers as a decision committee has succeeded in reducing classification error. The main current decision committee learning approaches, boosting and bagging, use resampling of the training set, and they can be used with different machine le...



Journal:
  • JCP

Volume 9, Issue

Pages -

Published 2014